10 Awful Gaming Trends We're Glad Are Gone | ScreenRant

2022-08-01 | By Jed Chan

From unnecessary motion controls to obnoxious monetization schemes, gamers are glad that these all-time worst gaming trends are mostly gone.

Gaming is in a weird place at the moment; while the new ninth-generation consoles promise progress on a scale never before seen, demand for the Xbox Series X and PlayStation 5 still far outpaces supply, meaning that few gamers have access to the latest and greatest in console gaming.

Yet, frustrating as it is, the supply shortage and the scalper-plagued market it has created aren't the worst trends gamers have endured. From abhorrent microtransaction schemes to ridiculous NFT grifts, here are 10 awful gaming trends that have either died off completely or are on the verge of kicking the bucket.

While predecessors like the Power Glove and the Sega Activator set a precedent for motion controls in gaming, the seventh console generation brought them to the forefront. Motion controls were the core conceit of Nintendo's Wii, released in 2006, and both Microsoft and Sony later followed suit, introducing the Kinect and the PlayStation Move respectively.

While neat novelties, gamers quickly grew tired of forced motion control integration, particularly in third-party titles. As a result, when the eighth console generation kicked off in 2013, Nintendo adopted a new gimmick for their Wii U, and Microsoft quickly learned that gamers weren't going to spend an extra hundred dollars on an Xbox One bundled with Kinect.

Though the franchise had already attained a respectable player base beforehand, 2007 was a turning point for Activision's Call of Duty franchise. The release of Call of Duty: Modern Warfare made online multiplayer a must-have for console games, and dozens of competing franchises followed suit.

RELATED: 10 Awesome Video Game Demakes That You Can Play Right Now

From BioShock to Mass Effect, half-baked multiplayer modes were a huge part of the seventh generation of console gaming. While some of them did turn out to be memorable, they were, by and large, a waste of development time, and most of them are completely unplayable now due to their player bases being nonexistent.

A popular means of making lengthy cutscenes more engaging, quick-time events became a gaming mainstay around the turn of the century, most often associated with the cult hit pseudo-life sim Shenmue. Unfortunately, some developers came to rely on them a bit too much, with some monumental moments in franchises as notable as Resident Evil and Call of Duty falling victim to these basic button-mashing mechanics.

In the modern era, QTEs aren't unheard of, but they are certainly much rarer than they were two decades ago. The late-2000s trend of coupling QTEs with motion controls likely had a hand in killing the concept off.

From R.O.B. the Robot to the SNES Super Scope, peripherals have been a part of video games since the rejuvenation of home console gaming in the mid-80s. However, things came to a head in the mid-2000s with the arrival of popular rhythm games such as Guitar Hero and Rock Band.

RELATED: The 11 Best Rhythm Video Games, According To Metacritic

Though they were phenomena in their own right, there was a point from 2006 to 2012 or so during which gamers' living rooms were cluttered with plastic drum kits, guitars, and turntables. The trend collapsed in the early 2010s, but plenty of gamers still probably have old plastic instruments collecting dust in their basements or attics.

Though Sony and Microsoft portrayed their seventh-generation consoles as marvels of hardware engineering, developers still had to rely on all sorts of tricks to distract from muddy textures, gnarled graphics, and other visual issues.

This often resulted in an over-reliance on "realistic" camera effects such as bloom, film grain, and lens flares. These effects allowed developers to market the realism of their titles while simultaneously obfuscating some decidedly unrealistic flaws. Fortunately, in the modern era, computing power is such that techniques like these are no longer used—at least, not overbearingly so.

When smartphones entered the mainstream in the late 2000s, gaming publishers recognized a major opportunity to port fan-favorite gaming franchises to mobile devices. Unfortunately, the result of this was a slew of half-baked conversions that hardly represented their console and PC counterparts. Take a look at the travesty that was the mobile version of Resident Evil 4 to get an idea of how desperate these games often were.

Games like Plants vs. Zombies and Cut the Rope dominated this pioneering era, prompting industry leaders to follow suit—and eventually to discover that free-to-play titles loaded with microtransactions worked out far better in the mobile space.

Before the seventh console generation, the ability to save a game was a luxury not afforded to every gamer. In the 80s and 90s, most titles featured tedious password systems to circumvent the lack of a battery backup, and plenty of players who cut their teeth on the first two PlayStation consoles can speak to the struggle of leaving a console on for days to make up for a missing memory card.

RELATED: 10 Best Forgotten PS2 Horror Games

Fortunately, those days are long gone, though this issue has been resurrected in a slightly different form. As download sizes balloon, gamers find themselves searching for external hard drives to compensate for a lack of onboard memory.

As microtransactions slowly swallowed up the mobile gaming market, major publishers opted to begin integrating them into AAA experiences, as well. The eighth console generation was particularly plagued by shoehorned-in cosmetic bundles and virtual currencies, and major franchises such as Call of Duty and Madden NFL were especially bogged down with these predatory schemes.

While they are still prevalent in sports games, microtransactions in AAA games have mostly fallen by the wayside in favor of battle pass systems. Though flawed, battle passes often grant avid players a chance to earn more cosmetics at a lower cost.

Infamously relabeled "surprise mechanics" by an EA representative during a UK parliamentary hearing, loot boxes were an atrocious gaming trend that infested console and mobile gaming in the mid-to-late 2010s. The fervor among some publishers to include these mechanics was such that highly anticipated titles like Star Wars: Battlefront II and Middle-earth: Shadow of War all but required them.

Fortunately, fan pushback was immense, and publishers were quick to backpedal. Today, while loot boxes still exist, they aren't nearly as common as they once were and are mostly relegated to live-service multiplayer titles.

Cryptocurrencies and NFTs are all the rage among tech-savvy investors, but almost all gamers agree that they have no place in the gaming space. Yet, some publishers seem eager to get in on the next big thing and have opted to implement NFTs in some of their games.

The most infamous example of this was Ubisoft's Quartz initiative, which saw NFT cosmetics roll out in the already unfavorably received Ghost Recon: Breakpoint. Backlash, combined with low sales, prompted Ubisoft to backtrack, and, while publishers still threaten to integrate NFTs into their games, the window for widespread adoption of NFTs in gaming seems to have closed.

NEXT: 10 Movies That Started Huge Hollywood Trends